Implement node output destructuring #3806
JustJ01 wants to merge 8 commits into GraphiteEditor:master
Conversation
Summary of Changes
Hello @JustJ01, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed! This pull request finalizes the implementation of node output destructuring, a feature that significantly enhances the granularity and flexibility of node interactions. By allowing complex struct outputs to be broken down into their constituent fields, it simplifies data flow and improves the usability of nodes within the graph, making it easier to connect specific data points without intermediate processing steps.
Please rebase on the latest upstream master.
Code Review
The pull request introduces functionality for node output destructuring, allowing for the extraction of individual fields from a node's output. This involves adding output_fields to NodeMetadata, implementing a Destruct trait, and generating extractor nodes via a new destruct.rs module. The changes appear well-structured and integrate smoothly with the existing macro system. The OnceLock usage for protonode_identifier has been updated to correctly handle &'static str by leaking the boxed string, which is a common pattern for static string storage in Rust. The new deconstruct_output attribute in NodeFnAttributes provides a clear way to enable this feature for specific nodes.
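For context, the leaking pattern mentioned above can be sketched as follows. This is a minimal illustration of `Box::leak` for obtaining a runtime-built `&'static str`; the function name and identifier format are hypothetical, not the PR's actual code.

```rust
// A hedged sketch of the Box::leak pattern for static string storage.
// The identifier format here is illustrative only.
fn make_static_identifier(struct_name: &str, field_name: &str) -> &'static str {
    let owned = format!("{struct_name}::{field_name}");
    // Leaking the boxed String intentionally never frees it, yielding a
    // &'static str that stays valid for the rest of the process lifetime.
    Box::leak(owned.into_boxed_str())
}

fn main() {
    let id = make_static_identifier("SplitChannelsOutput", "red");
    assert_eq!(id, "SplitChannelsOutput::red");
}
```

The trade-off is a small, permanent allocation per identifier, which is acceptable for registry entries created once at startup.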
Force-pushed from e27f090 to e215927
editor/src/messages/portfolio/document/node_graph/document_node_definitions.rs (outdated review thread, resolved)
Also, please configure your AI / editor / git to automatically run …
@TrueDoctor I have made the changes you asked for.
```rust
#[cfg(feature = "std")]
#[node_macro::node(name("Split Channels"), category("Raster: Channels"), deconstruct_output)]
fn split_channels(_: impl Ctx, input: Table<Raster<CPU>>) -> SplitChannelsOutput {
```
Each call to extract_channel iterates over the entire raster, meaning the image is traversed 4 times. Maybe do a single-pass implementation? Something like this:

```rust
fn split_channels(_: impl Ctx, input: Table<Raster<CPU>>) -> SplitChannelsOutput {
	let (red, green, blue, alpha) = input.into_four_channels(); // hypothetical single-pass function
	SplitChannelsOutput { red, green, blue, alpha }
}
```

I can commit this function and implementation if you'd like.
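The suggested `into_four_channels` is hypothetical; a minimal sketch of the single-pass idea, using simplified stand-in types rather than the real `Table<Raster<CPU>>` structures, might look like:

```rust
// Split RGBA pixels into four channel buffers in one traversal.
// `[[f32; 4]]` is a stand-in for the actual raster pixel storage.
fn split_channels_single_pass(pixels: &[[f32; 4]]) -> (Vec<f32>, Vec<f32>, Vec<f32>, Vec<f32>) {
    let n = pixels.len();
    let (mut r, mut g, mut b, mut a) = (
        Vec::with_capacity(n),
        Vec::with_capacity(n),
        Vec::with_capacity(n),
        Vec::with_capacity(n),
    );
    // One pass over the raster instead of four per-channel passes.
    for p in pixels {
        r.push(p[0]);
        g.push(p[1]);
        b.push(p[2]);
        a.push(p[3]);
    }
    (r, g, b, a)
}

fn main() {
    let (r, g, b, a) = split_channels_single_pass(&[[1.0, 0.5, 0.25, 1.0]]);
    assert_eq!((r[0], g[0], b[0], a[0]), (1.0, 0.5, 0.25, 1.0));
}
```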
I think in this case we should not touch the split channels node at all since it makes the code less efficient; this was done as per Keavon's request.
```rust
static FIELDS: std::sync::OnceLock<Vec<#core_types::registry::StructField>> = std::sync::OnceLock::new();
FIELDS.get_or_init(|| vec![
	#(#output_fields,)*
]).as_slice()
```
Why do you use heap memory + synchronization here instead of constructing a slice in place?
```rust
fn generate_extractor_node(core_types: &TokenStream2, fn_name: &syn::Ident, struct_name: &syn::Ident, field_name: &syn::Ident, ty: &Type, output_name: &LitStr) -> TokenStream2 {
	quote! {
		#[node_macro::node(category(""), name(#output_name))]
```
Why do we add the name() annotation here?
```rust
fn parse_output_name(attrs: &[syn::Attribute]) -> syn::Result<Option<String>> {
	let mut output_name = None;

	for attr in attrs {
		if !attr.path().is_ident("output") {
			continue;
		}

		let mut this_output_name = None;
		match &attr.meta {
			Meta::Path(_) => {
				return Err(Error::new_spanned(attr, "Expected output metadata like #[output(name = \"Result\")]"));
			}
			Meta::NameValue(_) => {
				return Err(Error::new_spanned(attr, "Expected output metadata like #[output(name = \"Result\")]"));
			}
			Meta::List(_) => {
				attr.parse_nested_meta(|meta| {
					if meta.path.is_ident("name") {
						if this_output_name.is_some() {
							return Err(meta.error("Multiple output names provided for one field"));
						}
						let value = meta.value()?;
						let lit: LitStr = value.parse()?;
						this_output_name = Some(lit.value());
						Ok(())
					} else {
						Err(meta.error("Unsupported output metadata. Supported syntax is #[output(name = \"...\")]"))
					}
				})?;
			}
		}

		let this_output_name = this_output_name.ok_or_else(|| Error::new_spanned(attr, "Missing output name. Use #[output(name = \"...\")]"))?;
		if output_name.is_some() {
			return Err(Error::new_spanned(attr, "Multiple #[output(...)] attributes are not allowed on one field"));
		}
		output_name = Some(this_output_name);
	}

	Ok(output_name)
}
```
Is this function only used for overriding the field name? I don't think this code is currently used anywhere, right? If it is indeed not used, it should be removed to reduce code complexity.
```rust
let use_memo = !available_output_fields.is_empty()
	&& node_registry.get(&memo_node).is_some_and(|memo_implementations| {
		memo_implementations
			.iter()
			.any(|(_, node_io)| node_io.call_argument == *input_type && node_io.return_value == first_node_io.return_value)
	});
```
use_memo will currently always be false since you did not add the new structs to the memo node implementations.
```rust
inputs: vec![NodeInput::node(source_node_id, 0)],
call_argument: input_type.clone(),
implementation: DocumentNodeImplementation::ProtoNode(memo_node.clone()),
visible: false,
```
There is no need to set visible to false since the entire network is hidden anyway.
```rust
DocumentNode {
	inputs: vec![NodeInput::node(source_node_id, 0)],
	call_argument: input_type.clone(),
	implementation: DocumentNodeImplementation::ProtoNode(memo_node.clone()),
```
Shouldn't we get a static string for the identifier? Why do we need heap allocations here?
@TrueDoctor I have started the AI code review. It will take a few minutes to complete.
No issues found across 10 files
Force-pushed from d6228da to e58c1de
Force-pushed from 9b97ab7 to 2e842cb
1 issue found across 2 files (changes from recent commits).
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
```xml
<file name="node-graph/node-macro/src/destruct.rs">
	<violation number="1" location="node-graph/node-macro/src/destruct.rs:5">
		P1: Custom agent: **PR title enforcement**
		PR title must be imperative (e.g., "Complete node output destructuring implementation"), but it uses past tense "Completed", which violates the PR title enforcement rules.
	</violation>
</file>
```
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
Force-pushed from 3adba15 to 958810d
```rust
// Exclude nodes with destructured outputs since their definition networks already contain the complete structure
let mut substitutions = preprocessor::generate_node_substitutions();
let node_metadata = graphene_std::registry::NODE_METADATA.lock().unwrap();
substitutions.retain(|id, _| node_metadata.get(id).is_none_or(|meta| meta.output_fields.is_empty()));
drop(node_metadata);
substitutions
```
Why is this done here instead of in the generate_node_substitutions method?
This is all out of my wheelhouse. If you're able to take it from here, I'd appreciate that. I basically just attempted to get it to work from a QA perspective without really understanding what's going on internally, since the alternative would have been to close the PR outright. Sorry I don't know the answer to the question, although it's probably worth fixing if it doesn't match your expectations.
I mean, you made the changes, so you should be able to address comments regarding them. The generate_node_substitutions function is used in other places as well, and it looks to me like the changes should probably apply to all callers, so it would likely make sense to integrate the change into the function (or, even better, instead of first generating the substitutions and then filtering them out, we should just not generate them in the first place). I won't have time to take over the PR myself.
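The "don't generate them in the first place" suggestion can be sketched as below. All names and types here are simplified stand-ins for the real registry and substitution structures, not the PR's actual code.

```rust
use std::collections::HashMap;

// Hedged sketch: skip destructured-output nodes while generating
// substitutions, rather than generating everything and filtering afterwards.
// The metadata map (node id -> output field names) is a simplified stand-in.
fn generate_node_substitutions(metadata: &HashMap<u64, Vec<&str>>) -> HashMap<u64, String> {
    let mut substitutions = HashMap::new();
    for (&id, output_fields) in metadata {
        // A node with destructured outputs never gets a substitution at all,
        // so every caller sees consistent behavior.
        if !output_fields.is_empty() {
            continue;
        }
        substitutions.insert(id, format!("substitution for node {id}"));
    }
    substitutions
}

fn main() {
    let mut metadata = HashMap::new();
    metadata.insert(1, vec![]);               // plain node: gets a substitution
    metadata.insert(2, vec!["red", "green"]); // destructured: skipped
    let subs = generate_node_substitutions(&metadata);
    assert!(subs.contains_key(&1));
    assert!(!subs.contains_key(&2));
}
```

Folding the check into the generator avoids the generate-then-retain pattern and keeps the rule in one place for all callers.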
From the discord discussions: https://discord.com/channels/731730685944922173/1210129484255076354/1475229902788628573